
NAVAL POSTGRADUATE SCHOOL 

Monterey , California 




THESIS 


MEASURING THE PERCEIVED EFFECTIVENESS OF THE 

INTERNAL REVENUE SERVICE'S (IRS) DIRECT 
FILING SYSTEM FROM THE END-USER PERSPECTIVE 



by 



Margaret Yvonne Hall 
MARCH 1991 



Thesis Advisor: 



Tung Bui 



Approved for public release: Distribution is unlimited 






SECURITY CLASSIFICATION OF THIS PAGE 



REPORT DOCUMENTATION PAGE 



1a REPORT SECURITY CLASSIFICATION 
Unclassified 



lb RESTRICTIVE MARKINGS 



2a. SECURITY CLASSIFICATION AUTHORITY 



2b DECLASSIFICATION/DOWNGRADING SCHEDULE 



3 DISTRIBUTION/AVAILABILITY OF REPORT 
Approved for public release; distribution is unlimited. 



4 PERFORMING ORGANIZATION REPORT NUMBER(S) 



5 MONITORING ORGANIZATION REPORT NUMBER(S) 



6a NAME OF PERFORMING ORGANIZATION 
Naval Postgraduate School 



6b OFFICE SYMBOL 
(If applicable) 
AS 



7a NAME OF MONITORING ORGANIZATION 
Naval Postgraduate School 



6c ADDRESS (C/fy, State, and ZIP Code) 
Monterey, CA 93943-5000 



7b ADDRESS (City, State, and ZIP Code) 
Monterey, CA 93943-5000 



8a NAME OF FUNDING/SPONSORING 
ORGANIZATION 



8b OFFICE SYMBOL 
(If applicable) 



9 PROCUREMENT INSTRUMENT IDENTIFICATION NUMBER 



8c ADDRESS (City, State, and ZIP Code) 



10 SOURCE OF FUNDING NUMBERS 



Program Element No 



Project No 



Task No 



Work Unit Accession 
Number 



1 1 . TITLE (Include Security Classification) 

MEASURING THE PERCEIVED EFFECTIVENESS OF THE INTERNAL REVENUE SERVICE'S (IRS) DIRECT FILING SYSTEM FROM 

THE END-USER PERSPECTIVE 



12. PERSONAL AUTHOR(S) Hall, Margaret Yvonne 



13a TYPE OF REPORT 
Master's Thesis 



13b TIME COVERED 
From To 



14 DATE OF REPORT (year, month, day) 
1991 March 



15 PAGE COUNT 
63 



16. SUPPLEMENTARY NOTATION 

The views expressed in this thesis are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. 

Government. 



17 COSATI CODES 



FIELD 



GROUP 



SUBGROUP 



1 8 SUBJECT TERMS (continue on reverse if necessary and identify by block number) 
Electronic Filing System, measuring the effectiveness of MIS 



19. ABSTRACT (continue on reverse if necessary and identify by block number) 

Although the objectives and potential benefits are clear for the Internal Revenue Service and some software 
developers on promoting Electronic Filing Systems (EFS), it is not that obvious for tax preparers and the public at 
large. As a consequence, the current rate of EFS usage is still below expectation. Based on a study on the 
Measurement of End-User Computing Satisfaction, an empirical survey was conducted among the tax preparers 
community in Central California to determine factors that could help increase EFS use. Our findings seem to 
confirm the results cited in the End-User Computing literature. Software reliability, flexibility, efficiency and ease 
of use, quality of documentation, ability to make corrections, and timeliness were the most relevant findings. These 
factors received high scores from the interviewees. Nevertheless, training appears to be a crucial factor to convince 
tax preparers of the reliability of EFS, and tax preparers should be encouraged to devote more time in getting 
acquainted with the documentation provided by the IRS which was generally perceived as satisfactory. Another 
concern in EFS use was cost, although this factor was not included in the statistical analysis. 



20. DISTRIBUTION/AVAILABILITY OF ABSTRACT 

UNCLASSIFIED/UNLIMITED    SAME AS REPORT    DTIC USERS 

22a. NAME OF RESPONSIBLE INDIVIDUAL 

Tung Bui 



21 . ABSTRACT SECURITY CLASSIFICATION 
Unclassified 



22b TELEPHONE (Include Area code) 
(408)646-2630 



22c. OFFICE SYMBOL 
AS/Bd 



DD FORM 1473, 84 MAR 



83 APR edition may be used until exhausted 
All other editions are obsolete 



SECURITY CLASSIFICATION OF THIS PAGE 
Unclassified 



Approved for public release: Distribution is unlimited 

Measuring the Perceived Effectiveness 
of the Internal Revenue Service's (IRS) Direct Filing System 

from the End-User Perspective 

by 

Margaret Yvonne Hall 

Lieutenant, United States Navy 

B.A. , Benedict College, 1979 

Submitted in partial fulfillment of the 
requirements for the degree of 

MASTER OF SCIENCE 
IN INFORMATION SYSTEMS 

from the 

NAVAL POSTGRADUATE SCHOOL 

MARCH 1991 






David R. Whipple, Chairman 
Department of Administrative Sciences 






ABSTRACT 

Although the objectives and potential benefits are clear 
for the Internal Revenue Service and some software developers 
on promoting Electronic Filing Systems (EFS) , it is not that 
obvious for tax preparers and the public at large. As a 
consequence, the current rate of EFS usage is still below 
expectation. Based on a study on the Measurement of End-User 
Computing Satisfaction, an empirical survey was conducted 
among the tax preparers community in Central California to 
determine factors that could help increase EFS use. Our 
findings seem to confirm the results cited in the End-User 
Computing literature. Software reliability, flexibility, 
efficiency and ease of use, quality of documentation, ability 
to make corrections, and timeliness were the most relevant 
findings. These factors received high scores from the 
interviewees. Nevertheless, training appears to be a crucial 
factor to convince tax preparers of the reliability of EFS, 
and tax preparers should be encouraged to devote more time in 
getting acquainted with the documentation provided by the IRS 
which was generally perceived as satisfactory. Another 
concern in EFS use was cost, although this factor was not 
included in the statistical analysis. 









TABLE OF CONTENTS 

I. INTRODUCTION 1 

A. OBJECTIVE STUDY 1 

B. PROBLEM STATEMENT 3 

C. METHODOLOGY 5 

D. SCOPE AND LIMITATIONS 5 

II. BACKGROUND 6 

A. BRIEF EXPLANATION OF EFS 6 

B. PROBLEMS OF EFS 9 

C. RESEARCH QUESTIONS 10 

III. A FRAMEWORK OF MEASURING END-USER COMPUTING SATISFACTION 11 

A. INTRODUCTION TO END-USER COMPUTING 
SATISFACTION 11 

B. MEASURING END-USER COMPUTING SATISFACTION . . 12 

IV. RESEARCH DESIGN 18 

A. PROBLEM STATEMENT AND RESEARCH QUESTIONS . . 18 

B. SAMPLE CHARACTERISTICS 18 

C. SURVEY METHOD 19 

D. STATISTICAL METHODS 20 

1. Mean Values and Standard Deviation ... 20 

2. Linear Correlation 21 

3. Factor Analysis 22 

4. Linear Regressions 23 

E. DATA ANALYSIS 23 

1. Mean Values and Standard Deviations . . 23 

2. Correlation Matrix 23 

3. Factor Analysis 32 

4. Linear Regression 39 

VI. SUMMARY 43 

A. SUMMARY OF RESULTS 43 

B. SUGGESTIONS FOR FUTURE RESEARCH 43 

VII. CONCLUSION 45 

APPENDIX END-USER SATISFACTION SURVEY OF THE IRS 

ELECTRONIC FILING SYSTEM PROGRAM (EFS) ... 46 

REFERENCES 51 

BIBLIOGRAPHY 53 

INITIAL DISTRIBUTION LIST 55 






LIST OF FIGURES 


Figure 1 Summary of Model 13 

Figure 2 Summary of the Model 42 



LIST OF TABLES 



TABLE I    MEAN SCORES AND STANDARD DEVIATIONS  . . . 24 

TABLE II   CORRELATION ANALYSIS  . . . 25 

TABLE II   CORRELATION ANALYSIS (continued)  . . . 26 

TABLE III  INITIAL FACTOR METHOD: PRINCIPAL COMPONENTS FACTOR PATTERN  . . . 33 

TABLE IV   ROTATION METHOD: VARIMAX ROTATED FACTOR PATTERN  . . . 34 

TABLE V    VARIANCE EXPLAINED BY EACH FACTOR  . . . 35 

TABLE VI   REGRESSION MODELS FOR DEPENDENT VARIABLE: MODEL, MODEL: MODEL 1  . . . 40 

TABLE VII  REGRESSION MODELS FOR DEPENDENT VARIABLE: RETURNS, MODEL: MODEL 1  . . . 41 






I . INTRODUCTION 

Although the objectives and potential benefits are clear 
for the Internal Revenue Service and some software developers 
to promote Electronic Filing Systems, it is not that obvious 
for tax preparers and the public at large. The literature 
advocates that to guarantee a successful implementation of 
Information Systems, the Information System's goals should 
perfectly reflect the goals of the organization [Ref . 1] . The 
Electronic Filing System involves at least four user 
communities: IRS, Tax preparers, the Public and software 
designers. Under the presence of multiple entities coupled 
with the proliferation rate of end-user computers, it is 
crucial to understand how a consensus can be reached. To 
address this issue, each of the following research issues had 
to be explored for each of the constituencies. "Is the system 
effective and if so, to what degree are the end-users 
satisfied?" "What is meant by effectiveness?" and "What 
factors are used to measure effectiveness?" 

A. OBJECTIVE STUDY 

The literature suggests a number of criteria for success 
of Management Information Systems-profitability [Ref. 2], end- 
user satisfaction [Refs. 1, 3, 4, 5], system usage [Refs. 6, 
7, 8, 9, 10], performance [Ref. 6] and job satisfaction [Ref. 



6]. Of these perspectives, end-user satisfaction and system 
usage are the most widely used measures [Refs. 2, 11] and thus 
have been adopted by this study as the indicators of the 
perceived effectiveness or success of Management Information 
Systems. 

End-user computing satisfaction (EUCS) can be measured in 
a decision-making organization [Ref . 1] . An end-user 
application's utility in decision making is enhanced when the 
outputs meet the user's expected information needs and the 
application is easy to use [Ref. 1] . "Ease of use" or "user 
friendliness" is especially important in organizations 
facilitating voluntary use of inquiry or decision support 
systems, or Management Information Systems [Ref. 1] . In 
voluntary situations, system usage can also be a measure of 
system success. Baroudi, et al. (1986) suggest that, 
"satisfaction leads to usage rather than usage stimulating 
satisfaction." Thus, user satisfaction may be the critical 
factor in measuring the success of Information Systems. Most 
studies conducted in the realm of Information Systems have 
focused on general satisfaction rather than on a specific 
software application, and they have omitted aspects important 
to end-user computing such as ease of use [Ref. 1] . A study on 
the Measurement of End-User Computing Satisfaction by Doll and 
Torkzadeh distinguished between user information satisfaction 
and an end-user's satisfaction with a specific application 
[Ref. 1] . Their study attempted to measure the satisfaction of 



users who directly interact with a specific application [Ref . 
1]. The results of their study identified five factors which 
they considered should be used as a standard instrument for 
measuring end-user computing satisfaction. The factors were 
content, accuracy, format, ease of use, and timeliness. [Ref. 

1] 

This study will replicate the Doll and Torkzadeh study 
[Ref. 1] to determine whether or not the same findings would 
apply in the Internal Revenue Service's case. The focus was on 
measuring EUCS among tax preparers. The goal of this study was 
to develop an instrument that: 



Focuses on satisfaction with the information 
product provided by a specific application. 

Includes items to evaluate the ease of use of a 
specific application. 

Determines the factors end-users perceive 
important in measuring EUCS. 



B. PROBLEM STATEMENT 

In any end-user computing environment, decision makers 
interact directly with the application software to enter 
information or prepare output reports. The environment will 
typically include a database, a model base, and an interactive 
software system that enables the user to directly interact 
with the computer system. In an end-user computing 
environment, analysts, programmers and operations staff are 



less directly involved in user support; and as a result, users 
assume more responsibility for their own applications. System 
personnel might assist in the selection of appropriate 
software tools, but the end-users are largely on their own to 
implement, modify, and run their own applications. "Training 
programs, experienced colleagues, and manuals provide some 
assistance. However, the goal of Management Information System 
staff and service policies typically focuses on enabling end- 
users to function more independently to solve many problems 
on their own." Ease of use has become increasingly important 
in software design (Branscomb and Thomas, 1984). There is 
increasing evidence that the effective functioning of an 
application depends on its ease of use or usability (Goodwin, 
1987) . "If end-users find an application easy to use, they may 
become more advanced users, and therefore, better able to take 
advantage of the range of capabilities the software has to 
offer. Also, ease of use may improve organizational 
productivity [Ref. 1]." This study is based on our belief that 
there are certain underlying factors which should be 
considered when promoting effective Information Systems. The 
Internal Revenue Service's Electronic Filing System was used 
to study these factors. 



C. METHODOLOGY 

The sample size was limited to Central California. Factor 
Analysis and Linear Regression was used to analyze the data 
collected from the questionnaire. 

D. SCOPE AND LIMITATIONS 

The Electronic Filing System encompasses different categories 
of users: IRS, taxpayers, software users and the public at 
large. This thesis focuses only on the tax preparer community. 
Our assumption was based on the fact that most taxpayers who 
wish to use EFS would go through a professional tax preparer. 
Since EFS implementation is still in its early phase, the 
results of our study may not be extrapolated. 



II. BACKGROUND 

A. BRIEF EXPLANATION OF EFS 

The electronic filing of federal individual income tax 
returns began as a pilot in 1986, when five participants in 
three metropolitan areas filed more than 25,000 electronic tax 
returns. It became an operational system in 1987 with 78,000 
returns filed by 60 practitioners in seven metropolitan areas. 
Today electronic filing is available in all 50 states. 
Electronic filing allows tax returns to be filed with the 
Internal Revenue Service via telephone lines with the use of 
modems. The Electronic Filing System is composed of three 
subsystems: (1) The Communications Subsystem, which is comprised 
of both DIAL-UP and LEASED (dedicated) modems that allow the tax 
preparer/transmitter to call into the IRS computer. The tax 
preparer/transmitter sends a tax return via modem to a 
communications processor. The processor receives the tax data, 
merges it into the processing subsystem, determines whether a 
complete tax return has been received, and creates an 
acknowledgement tape file. (2) The Processing Subsystem consists 
of a UNISYS 1180 Mainframe computer and a series of COBOL 
programs to formulate and validate the information received by 
the Communications Subsystem. The processing steps are: 



• Expansion creates fixed records. 

• Return validation — determines if there is a complete tax 
return. 


• Assigns DLN. 

• Block Control — assigns a document control number. 

• Code and Edit. 

• Return record — creation of what the electronic record 
becomes. 

• DIS tape — this tape is meshed with the key-entered paper 
return tax data to create one tape for the National 
Computer Center to release refunds. 

• Acknowledgement record — creates and acknowledges the 
acceptance of every electronically filed return back to 
the user. 

(3) Archival/retrieval subsystem is a long-term and short-term 
storage vehicle for all electronically filed returns. 
Preparers/transmitters apply for admission into the program 
and send test transmission to determine the quality of their 
operation. Once accepted into the program they may transmit 
return data directly to the IRS via modem. This return data is 
processed and if valid, an acknowledgement file is sent to the 
transmitter. Error free returns are archived to optical disk 
for long-term storage. Returns that need corrections or 
adjustments are temporarily stored on disk to allow the tax 
examiner to perfect errors. Tax examiners use computers to 
"call-up" the tax return and make all corrections by creating 
a shadow page of the return. 
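For illustration only, the accept/acknowledge flow just described can be sketched as follows. The record fields, the validate() rule, and the storage objects are assumptions made for this Python sketch; they are not drawn from the IRS implementation.

from dataclasses import dataclass

@dataclass
class ElectronicReturn:
    dln: str              # document control number assigned during processing
    ssn: str
    fields: dict          # expanded, fixed-format return records

def validate(ret: ElectronicReturn) -> list[str]:
    """Return a list of error codes; an empty list means a complete, valid return."""
    errors = []
    if not ret.fields.get("total_income"):
        errors.append("MISSING_INCOME")
    if len(ret.ssn) != 9 or not ret.ssn.isdigit():
        errors.append("BAD_SSN")
    return errors

def process(ret: ElectronicReturn, archive, error_disk, ack_file):
    """Validate a transmitted return, acknowledge it, and route it to storage."""
    errors = validate(ret)
    ack_file.append({"dln": ret.dln, "accepted": not errors, "errors": errors})
    if errors:
        error_disk.append(ret)   # held on disk for a tax examiner to correct
    else:
        archive.append(ret)      # error-free returns go to long-term storage

Under this sketch, the acknowledgement file plays the role described above: every transmitted return is acknowledged back to the filer, whether it is archived or held for correction.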



This system eliminates many time-consuming manual 
processes, decreases the chances of error, and speeds the 
delivery of tax refunds to the American public. The goal — 
less paper, less time-consuming and error prone manual 
processing. The electronic filing system offers many 
advantages to the electronic filer who: 

• is saved from paperwork 

• realizes lower mailing cost 

• improves the quality of services provided and, 

• receives a competitive edge. 

The Internal Revenue Service benefits by: 

• the elimination of error prone and time consuming 
processes 

• improved quality 

• enhanced service to the public and 

• an expanded capacity for operation 

The taxpayer also benefits from: 

• a return processed with less chance of error 

• acknowledgement that the tax return was received and, 

• a refund up to three weeks sooner than those from paper 
returns . 






B. PROBLEMS OF EFS 

Despite the aggressive policy by the IRS to promote 
implementation and use of EFS throughout the nation, effective 
use of the system can be perceived as relatively low. 
According to the Commissioner's briefing on EFS, many private, 
not-for-profit organizations, and other government agencies 
have asked why the IRS has not developed software for their 
use. The two primary reasons the EFS office has not done this 
are (1) to avoid competition between the government and 
private industry in software development, and (2) the 
government cannot be held accountable for erroneous 
submissions. By not becoming involved in actual return 
preparation software, the IRS can maintain its proper autonomy 
in meeting its mission of Tax Administration. Although 
productivity and service to the taxpayers have far exceeded 
expectations, several additional problems were addressed in 
the report [Ref. 12]. It is important to note according to the 
commissioner's report that none of these problems are a result 
of system problems with Electronic Filing. Some of the 
problems do involve electronically filed returns. Here are 
some of the more notable problems and the circumstances 
related to them: 

• Improper handling of tapes 

• Timely receipt of refunds for electronic returns — there 
have been several cases where taxpayers expressed concern 
that they had not received their refund in the two-to- 
three week period suggested by EFS. The major reason for 



this circumstance is the filer's submission of the return 
to IRS. Filers are instructed to file returns timely and 
are monitored by the IRS personnel to ensure that they 
comply with this requirement. Most filers comply with this 
procedure, which results in timely acknowledgement by the 
IRS. The two-to-three week processing of the refund by EFS 
begins with the acknowledgement date. In cases where a 
filer delays submitting the return timely, it can increase 
the refund processing time by an additional week or more. 
This delay is a result of the processing cycle for all 
returns, paper and electronic, and is not a function of 
the Electronic Filing System itself. 

Difficulty in the posting of some Direct Deposit Payments 
with the Financial Management Service (FMS) . The most 
prevalent circumstances are an error in the bank's routing 
transit number or an incorrect account number. Both of 
these numbers are provided by the taxpayers. When an 
erroneous number is provided the result is a non-posting 
of the refund, the problem is the inability of FMS to 
properly and timely post a credit to IRS for the returned 
Direct Deposit, so IRS can issue a paper check. 



C. RESEARCH QUESTIONS 

This study will explore the set of problems presented in 
the Commissioner's report and determine from the end-user's 
(tax preparers) perspective what they feel slowed the 
promotion of EFS, the factors they consider important in the 
successful implementation of EFS, and what can be done to 
improve system acceptability and usage. 






III. A FRAMEWORK OF MEASURING END-USER 
COMPUTING SATISFACTION 

A. INTRODUCTION TO END-USER COMPUTING SATISFACTION 

The Management Information Systems Literature suggests 
various perspectives on measures of EUCS effectiveness: user 
satisfaction [Ref. 1, 5], which suggests a 12-item instrument 
that measures five components of end-user satisfaction — 
content, accuracy, format, ease of use, and timeliness, and 
others which test several hypotheses such as the greater the 
perceived user friendliness of the software tool(s) used, the 
greater the overall user satisfaction. If the user's attitude 
toward computer applications is a positive one the higher will 
be the degree of overall user satisfaction, and the computer 
background of the user will also exert a moderating effect on 
the user's perception of the user friendliness of a software 
tool. System usage [Ref. 6, 9], which measures management use 
of the Information Systems and the impact of the Information 
System on organizational performance and the impact of a 
Management Information System on individual or organizational 
performance, also has a profound impact on measuring system 
success. Profitability [Ref. 1], in which three levels of 
measures were identified, operational, managerial, and 
strategic; and performance [Ref. 6] are also good measures. 






But of these, user satisfaction has been adopted by this study 
as an indicator of EUC success. 

B. MEASURING END-USER COMPUTING SATISFACTION 

According to Benson (1983) and Lefkovitis (1979) 

"end-user computing has proliferated in the last ten 
years, and although in its early stages, signs of rapid 
growth are still evident. Rockart and Flannery (1983) 
found annual EUC growth rates of 50 percent to 90 percent. 
Benjamin (1982) has predicted that by 1990 EUC will absorb 
as much as 75 percent of the corporate computer budget. 
Because of these trends, Rockart and Flannery call for 
better management to improve the success of end-user 
computing. To improve management of EUC, Cheney (1986) 
call for more empirical research on the factors which 
influence the success of end-user computing. Henderson and 
Treacy (1986) describe a sequence of perspectives 
(implementation, marketing, operation, and economic) for 
managing end-user computing and identifying objectives for 
each phase. In the implementation phase, they maintain 
that objectives should focus on increased usage and user 
satisfaction. " 

As previously stated, Doll and Torkzadeh [Ref. 1] 
contrast traditional versus end-user computing environments 
and report on the development of an instrument which merges 
ease of use and information product items to measure the 
satisfaction of users who directly interact with the computer 
for a specific application. The researchers surveyed 618 end- 
users, and the results of their survey are contained in Figure 
1. 

The model suggests a 12-item instrument that measures five 
components of end-user satisfaction-content, accuracy, format, 
ease of use, and timeliness. The specific goals of their 
research were to develop an instrument that: 







Figure 1. Summary of Model 



CONTENT 

C1: Does the system provide the precise information you need? 

C2: Does the information content meet your needs? 

C3: Does the system provide reports that seem to be just about what you need? 

C4: Does the system provide sufficient information? 

EASE OF USE 

E1: Is the system user friendly? 
E2: Is the system easy to use? 

ACCURACY 

A1: Is the system accurate? 

A2: Are you satisfied with the accuracy of the system? 

FORMAT 

F1: Do you think the output is presented in a useful format? 
F2: Is the information clear? 

TIMELINESS 

T1: Do you get the information you need in time? 
T2: Does the system provide up-to-date information? 
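For illustration, the 12 items in Figure 1 can be summarized by averaging the 1-5 responses within each component. The sketch below uses hypothetical answers; averaging within subscales is a common convention and is not necessarily how Doll and Torkzadeh scored the instrument.

SUBSCALES = {
    "content":     ["C1", "C2", "C3", "C4"],
    "ease_of_use": ["E1", "E2"],
    "accuracy":    ["A1", "A2"],
    "format":      ["F1", "F2"],
    "timeliness":  ["T1", "T2"],
}

def subscale_means(responses: dict) -> dict:
    """Average the 1-5 responses within each component of satisfaction."""
    return {name: sum(responses[item] for item in items) / len(items)
            for name, items in SUBSCALES.items()}

# Hypothetical respondent: 5 = almost always, 1 = almost never
answers = {"C1": 4, "C2": 4, "C3": 3, "C4": 5, "E1": 5, "E2": 4,
           "A1": 5, "A2": 5, "F1": 4, "F2": 4, "T1": 3, "T2": 4}
print(subscale_means(answers))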






Focuses on satisfaction with the information product 
provided by a specific application; 

Includes items to evaluate the ease of use of a specific 
application; 

Provides Likert-type scales as an alternative to semantic 
differential scaling; 

Is short, easy to use, and appropriate for both academic 
research and practice; 

Can be used with confidence across a variety of 
applications and 

Enables researchers to explore the relationships between 
end-user computing satisfaction and plausible independent 
variables. 



An additional goal for the researchers was to identify 
underlying factors or components of end-user computing 
satisfaction. In developing the model for measuring end-user 
computing satisfaction, the researchers developed a 40-item 
instrument using a five point Likert-type scale, where 1 = 
almost never; 2 = some of the time; 3 = about half of the 
time; 4 = most of the time; and 5 = almost always. The 
instructions requested the users to write in the name of their 
specific application and, for each question, to circle the 
response which best described their satisfaction with this 
application. Next, a structured interview questionnaire was 
developed where users were asked open-ended questions such as: 
How satisfied were they with the application? What aspects of 
the application, if any, were they most satisfied with and 
why? What aspects of the application, if any, were they most 






dissatisfied with and why? Correlation was done on each item 
against two corrected items, and items were eliminated if 
their correlation with the corrected item total was below .5 
or if their correlation with the two-item criterion scale was 
below .4. The cutoffs the researchers chose were arbitrary, for 
there are no accepted standards. The cutoffs were considered 
high enough to ensure that the items retained were adequate 
measures of the end-user computing satisfaction construct. 
These two criteria enabled the researchers to reduce the 38 
items to 23. Five additional items were deleted because they 
represented the same aspects with only slightly different 
wordings. In each case, the wording with the lowest corrected 
item total correlation was deleted. In the pilot study, the 
remaining 18 items had a reliability (Cronbach's alpha) of .94 
and a correlation of .81 with the two-item criterion scale. 
The researchers conducted an exploratory factor analysis and 
modified the instrument, examined discriminant validity of the 
modified instrument, and assessed reliability and criterion- 
related validity by nature and type of application. Factor 
analysis was used to identify the underlying factors or 
components of end-users satisfaction that comprise the domain 
of the end-user satisfaction construct. Items which were not 
factorially pure were eliminated to form a modified 
instrument. Using the sample of 618 responses, the data was 
examined using principal components analysis as the extraction 
technique and varimax as a method of rotation. Without 




specifying the number of factors, three factors with eigenvalues 
greater than one emerged. These factors were 
interpreted as content/ format, accuracy/timeliness, and ease 
of use/efficiency. In order for the researchers to achieve 
more precise and interpretable factors, the analysis was 
conducted specifying two, four, five and six factors. The 
researchers felt that specifying five factors resulted in the 
most interpretable structure. These factors were interpreted 
as content, accuracy, format, ease of use, and timeliness and 
explained 78.9 percent of the variance. The 12 item instrument 
had a reliability of .92 and a criterion-related validity of 
.76. The criterion was three separate measures of overall end- 
user satisfaction with the application. The reliability of 
each factor was: content = .89; accuracy = .91; format = .78; 
ease of use = .85; and timeliness = .82. The correlation of 
each factor with the criterion was: content = .69; accuracy = 
.55; format = .60; ease of use = .58; and timeliness = .60. 
It is the opinion of the researchers that the instrument 
presented in this article represents substantial progress 
towards establishment of a standard instrument for measuring 
end-user satisfaction. The data according to Doll and 
Torkzadeh support the construct and discriminant validity of 
the instrument. Furthermore, "the instrument appears to have 
adequate reliability and criterion-related validity across a 
variety of applications. This 12-item instrument may be 
utilized to evaluate end-user applications. In addition to an 




overall assessment, it can be used to compare end-user 
satisfaction with specific components across applications." 
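The two item-screening statistics used above, the corrected item-total correlation and Cronbach's alpha, can be computed as in the following sketch. The response matrix is synthetic placeholder data; only the formulas follow the procedure described in the text.

import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(618, 1))                      # a common satisfaction factor
items = np.clip(np.round(3 + latent + rng.normal(size=(618, 18))), 1, 5)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the remaining items."""
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

passes = corrected_item_total(items) >= 0.5             # the .5 cutoff used in the study
print(round(cronbach_alpha(items), 2), int(passes.sum()))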






IV. RESEARCH DESIGN 

A. PROBLEM STATEMENT AND RESEARCH QUESTIONS 

As mentioned earlier, despite the aggressive policy by the 
IRS to promote implementation and use of EFS throughout the 
nation, effective use of the system can be perceived as 
relatively low. The problems cited earlier are also somewhat 
technical in nature, in that they do not 
address critical issues pertaining to Management, 
Implementation and usage of Information Systems. In particular 
the commissioner's report overlooked the aspect of end-user 
resistance to using a new technology. This study will 
determine from the end-user (tax preparers) perspective the 
reasons they felt slowed the promotion of EFS, and the factors 
they consider important in the successful implementation of 
EFS, and what can be done to improve system acceptability and 
usage. 

B. SAMPLE CHARACTERISTICS 

Data was gathered from the use of a questionnaire that was 
administered to 300 different firms — all tax preparers 
located in the Central California area. The size of the firms 
ranged anywhere from a one person operation to a firm with 22 
employees. A sample of 63 end-users actually responded to the 
questionnaire holding various positions in the firm — CPA, 






President, Tax preparer, Office Manager, Principal, Manager of 
Computer Department; with anywhere from one to two years 
experience with the Electronic Filing System. The two types of 
computers used by the firms were Macintosh and IBM clones. 
Fifty-nine percent of the respondents used 2400 modems and 41% 
used 4800 modems. Fifty-seven percent of the respondents 
filed directly using software applications such as Lacrete, 
Drake, Am West, Orr Tax, Taxware, Taxview and Computer Craft. 
The remaining 43% filed electronically using a tax service. 

C. SURVEY METHOD 

The data collected are based on the questionnaire borrowed 
from the work of Doll and Torkzadeh [Ref . 1] . The advantage of 
using this questionnaire is that it has been validated and 
successfully tested by the previous researchers. However, the 
questionnaire has been slightly modified to better represent 
the EFS and tax-preparers' environment. Section I of the 
questionnaire dealt more with demographics, and the hardware 
environment of the firm such as type of computer and modem 
used. Section II of the questionnaire was developed using 40 
items which actually measured end-user computing satisfaction 
using a six-point Likert-type scale, where 0 = not applicable; 
1 = almost never; 2 = some of the time; 3 = about half of the 
time; 4 = most of the time; and 5 = almost always. The end- 
users were then asked to mark an X in the box which best 
described their satisfaction with their application. Section 






III was developed where users were asked open-ended questions 
such as: What aspects of the application, if any, were they 
most satisfied with? What aspects of the application, if any, 
were they most dissatisfied with? In their opinion what were 
the most important factors in promoting a successful use of 
EFS in their company? And, finally, the interviewees were 
asked for other comments. 
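One plausible way to code the Section II responses is sketched below, under the assumption that "0 = not applicable" answers are excluded from the statistics, which is consistent with the N* missing-value column reported in Table I. The answers shown are hypothetical.

import numpy as np

SCALE = {0: None, 1: "almost never", 2: "some of the time",
         3: "about half of the time", 4: "most of the time", 5: "almost always"}

raw = [5, 4, 0, 3, 5, 0, 2]                        # one question, seven respondents
coded = np.array([np.nan if r == 0 else r for r in raw], dtype=float)

print(int(np.isnan(coded).sum()))                  # N*: not-applicable answers
print(round(float(np.nanmean(coded)), 3))          # mean over the usable answers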

D. STATISTICAL METHODS 

The standard statistical methods that were applied to this 
research were mean values and standard deviations, linear 
correlation, factor analysis and linear regression using R- 
squared analysis. The SAS Program was used for factor 
analysis, and linear regression, and the statistical program, 
Minitab, was used to obtain results for Mean Values, Standard 
Deviations, and Linear Correlation. 

1. Mean Values and Standard Deviation 

Table I represents several descriptive measures of the 
data set pertaining to question 1 through question 40. The 
data indicate where the center or most typical value of a data 
set lies. The first entry of the output gives the number of 
pieces of data, which in this case for question 1 is 51. The 
next two entries displays the mean (MEAN) and median (MEDIAN) 
of the data set. In the fourth entry, labelled TRMEAN, we find 
the 5% trimmed mean. The fifth entry, STDEV, gives the sample 
standard deviation of the data. SEMEAN, shown next, stands for 






"standard error of the mean." MEAN of the data set is defined 
to be the sum of the data divided by the number of pieces of 
data (average). For question 1 of the survey we found the mean 
to be 3.00, which indicates that, on average, the 51 people 
surveyed felt that the system was flexible about half of the time. The 
MEDIAN of the data set is the number that divides the bottom 
50% of the data from the top 50%. For question 1 of the survey 
we found the median to be 3.00. For the purpose of this study 
the median would be more of an appropriate measure, because 
the median is not affected strongly by the relatively few 
surveys with extremely high or low responses, whereas the mean 
would be. Thus, the median provides a better indication in 
question 1 of how flexible end-users felt the system to be 
than the mean. The TRMEAN eliminates the bottom 5% and top 5% 
of the data before the mean is calculated. The STDEV (standard 
deviation) measures the variation in a data set and determines 
how far the data values are from the mean, on the average. The 
SEMEAN (standard error of the mean) indicates the amount of 
error that resulted from the sampling. 
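A minimal sketch of these descriptive measures for one question is shown below; the response vector is a made-up stand-in for the survey data, which is not reproduced here, and the statistics mirror the Table I columns.

import numpy as np
from scipy import stats

q1 = np.array([1, 2, 3, 3, 4, 5, 3, 2, 4, 3], dtype=float)   # hypothetical answers

n      = q1.size
mean   = q1.mean()
median = np.median(q1)
trmean = stats.trim_mean(q1, 0.05)          # 5% trimmed mean, as in Minitab
stdev  = q1.std(ddof=1)                     # sample standard deviation
semean = stdev / np.sqrt(n)                 # standard error of the mean

print(n, round(mean, 3), round(median, 3), round(trmean, 3),
      round(stdev, 3), round(semean, 3))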
2. Linear Correlation 

The linear correlation coefficient represented in 
Table II describes the strength of the linear (straight line) 
relationship between two variables. The linear correlation 
coefficient, r, is always between -1 and 1. Values of r close 
to -1 or +1 indicate a strong linear relationship between the 






variables and that the variable x is a good linear predictor 
of the variable y-that is, the regression equation is quite 
useful for making predictions. On the other hand, values of r 
near 0 indicate a weak linear relationship between the 
variables and that the variable x is not too useful as a 
linear predictor of the variable y-that is, the regression 
equation is not very valuable for making predictions. Positive 
values of r suggest that the variables are positively linearly 
correlated, meaning that y tends to increase linearly as x 
increases, with the tendency being greater the closer that r 
is to 1. Negative values of r suggest that the variables are 
negatively linearly correlated, meaning that y tends to 
decrease linearly as x increases, with the tendency being 
greater the closer that r is to -1. If the value of r is near 
0, then the slope of the regression line is also near 0, thus 
indicating that there is probably no linear relationship 
between the variables. Coefficient values of .7 and higher 
were chosen from the data set represented in Table II for 
further examination. This cutoff was arbitrary, for there are 
no adopted standards. 
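A sketch of this screening step follows: compute the pairwise Pearson correlations and keep the pairs with |r| of .7 or higher. The data matrix here is random placeholder data; the real analysis used the 40 survey questions and a 63-respondent sample.

import numpy as np

rng = np.random.default_rng(1)
answers = rng.integers(1, 6, size=(63, 40)).astype(float)   # respondents x questions

r = np.corrcoef(answers, rowvar=False)          # 40 x 40 correlation matrix

strong = [(i + 1, j + 1, round(r[i, j], 3))
          for i in range(r.shape[0]) for j in range(i)
          if abs(r[i, j]) >= 0.7]                # the cutoff chosen in the text
print(len(strong), strong[:5])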
3. Factor Analysis 

Factor analysis enabled us to see whether or not some 
underlying pattern of relationships existed such that the data 
could be "rearranged" or "reduced" to a smaller set of factors 
or components that may be taken as source variables accounting 






for the observed interrelations in the data. The three steps 
used were: 

• The preparation of the correlation matrix. 

• The extraction of the initial factors -- the exploration 
of possible data reduction, and 

• The rotation to a terminal solution — the search for 
simple and interpretable factors. The principal components 
analysis extracted initial factors in a way that made them 
independent from the others; that is, the factors are 
orthogonal. This approach shows the best linear 
combination of the variables — best in the sense that the 
particular combination of variables would account for more 
of the variance in the data as a whole than any other 
linear combination of variables. (A small numerical sketch 
of these three steps follows this list.) 
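The sketch below illustrates the three steps, assuming a standard principal-components extraction with a Kaiser (eigenvalue greater than one) cutoff and a textbook varimax rotation; it is not the SAS code used in the study, and the input matrix is random placeholder data.

import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(size=(63, 40))                  # respondents x questions

# Step 1: the correlation matrix.
corr = np.corrcoef(data, rowvar=False)

# Step 2: initial factors from principal components (eigenvalue > 1 retained).
eigval, eigvec = np.linalg.eigh(corr)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]
keep = eigval > 1.0
loadings = eigvec[:, keep] * np.sqrt(eigval[keep])

# Step 3: varimax rotation toward a simpler, more interpretable pattern.
def varimax(L, n_iter=100, tol=1e-6):
    p, k = L.shape
    R = np.eye(k)
    crit = 0.0
    for _ in range(n_iter):
        rotated = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p))
        R = u @ vt
        if s.sum() < crit * (1 + tol):
            break
        crit = s.sum()
    return L @ R

print(varimax(loadings).round(2)[:5])             # rotated loadings, first 5 items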

4. Linear Regressions 

R-squared was used in linear regression as an attempt 
to measure the proportion of variance in one variable 
"explained" by the other. 

E. DATA ANALYSIS 

1. Mean Values and Standard Deviations 

The mean scores of the 40 questions and their 
respective standard deviations are provided in Table I. 

2. Correlation Matrix 

A correlation analysis of the data gathered from the 
questionnaire is represented in Table II. 

Question 1 correlated with question (3) which 
indicates that there is a strong linear relationship between 
system flexibility and output. This correlation coefficient 






TABLE I 
MEAN SCORES AND STANDARD DEVIATIONS 

       N    N*   MEAN    MEDIAN  TRMEAN  STDEV   SEMEAN 
Q1    51   12   3.000   3.000   3.000   1.600   0.224 
Q2    61    2   2.574   2.000   2.527   1.190   0.152 
Q3    57    6   3.404   4.000   3.451   1.545   0.205 
Q4    56    7   3.536   4.000   3.600   1.464   0.196 
Q5    58    5   3.534   4.000   3.596   1.392   0.183 
Q6    56    7   2.232   2.000   2.140   1.388   0.185 
Q7    59    4   3.966   4.000   4.075   1.313   0.171 
Q8    60    3   3.633   4.000   3.704   1.365   0.176 
Q9    59    4   3.729   4.000   3.811   1.362   0.177 
Q10   57    6   4.211   5.000   4.333   1.114   0.148 
Q11   58    5   3.431   4.000   3.481   1.464   0.192 
Q12   55    8   3.545   4.000   3.612   1.317   0.178 
Q13   58    5   3.810   4.000   3.904   1.331   0.175 
Q14   55    8   3.364   4.000   3.408   1.393   0.188 
Q15   55    8   3.764   4.000   3.857   1.261   0.170 
Q16   55    8   3.691   4.000   3.776   1.303   0.176 
Q17   51   12   1.843   1.000   1.689   1.286   0.180 
Q18   58    5   3.259   3.500   3.288   1.482   0.195 
Q19   62    1   3.323   4.000   3.357   1.534   0.195 
Q20   62    1   3.387   4.000   3.429   1.561   0.198 
Q21   52   11   3.615   4.000   3.696   1.360   0.189 
Q22   59    4   3.610   4.000   3.679   1.414   0.184 
Q23   59    4   3.627   4.000   3.698   1.285   0.167 
Q24   59    4   2.254   2.000   2.170   1.372   0.179 
Q25   60    3   3.483   4.000   3.537   1.444   0.186 
Q26   58    5   2.328   2.000   2.250   1.419   0.186 
Q27   56    7   3.571   4.000   3.640   1.291   0.173 
Q28   62    1   3.758   4.000   3.839   1.363   0.173 
Q29   51   12   2.667   2.000   2.622   1.519   0.213 
Q30   57    6   3.474   4.000   3.529   1.377   0.182 
Q31   55    8   1.855   1.000   1.714   1.193   0.161 
Q32   59    4   3.864   4.000   3.962   1.332   0.173 
Q33   52   11   3.038   3.000   3.043   1.546   0.214 
Q34   52   11   2.769   3.000   2.739   1.516   0.210 
Q35   61    2   3.246   4.000   3.273   1.535   0.196 
Q36   54    9   2.833   3.000   2.812   1.299   0.177 
Q37   57    6   3.281   3.000   3.314   1.411   0.187 
Q38   34   29   3.235   3.500   3.267   1.257   0.216 
Q39   57    6   3.509   4.000   3.569   1.269   0.168 
Q40   61    2   3.148   3.000   3.164   1.276   0.163 



also indicates that system output is a good predictor of 
system flexibility. 

Question 3 correlated with questions (4, 8, 19, 20), 
which indicates that there is a strong linear relationship 






TABLE II 
CORRELATION ANALYSIS 

        Q1      Q2      Q3      Q4      Q5      Q6      Q7      Q8      Q9      Q10     Q11 
Q2    0.223 
Q3    0.667   0.368 
Q4    0.584   0.543   0.778 
Q5    0.517   0.359   0.673   0.725 
Q6   -0.416  -0.208  -0.475  -0.552  -0.668 
Q7    0.457   0.441   0.572   0.747   0.647  -0.441 
Q8    0.626   0.321   0.751   0.766   0.771  -0.517   0.805 
Q9    0.600   0.252   0.627   0.692   0.748  -0.445   0.650   0.875 
Q10   0.372   0.305   0.456   0.672   0.666  -0.401   0.768   0.770   0.706 
Q11   0.559   0.225   0.611   0.644   0.676  -0.272   0.642   0.678   0.600   0.639 
Q12   0.582   0.125   0.539   0.483   0.632  -0.264   0.579   0.699   0.689   0.769   0.742 
Q13   0.573   0.270   0.607   0.709   0.696  -0.434   0.742   0.831   0.793   0.833   0.724 
Q14   0.389   0.024   0.382   0.272   0.527  -0.290   0.416   0.557   0.512   0.495   0.499 
Q15   0.577   0.260   0.615   0.728   0.784  -0.538   0.738   0.897   0.799   0.824   0.720 
Q16   0.506   0.126   0.681   0.650   0.656  -0.403   0.677   0.799   0.708   0.753   0.755 
Q17  -0.061   0.138   0.122   0.062   0.243  -0.110   0.211   0.234   0.162   0.184   0.211 
Q18   0.687   0.161   0.645   0.707   0.747  -0.581   0.583   0.765   0.728   0.696   0.659 
Q19   0.628   0.302   0.725   0.789   0.678  -0.499   0.621   0.724   0.642   0.663   0.715 
Q20   0.696   0.322   0.710   0.835   0.721  -0.540   0.683   0.796   0.735   0.681   0.695 
Q21   0.683   0.232   0.634   0.681   0.726  -0.515   0.631   0.796   0.756   0.700   0.611 
Q22   0.497   0.188   0.633   0.737   0.579  -0.347   0.504   0.601   0.617   0.553   0.655 
Q23   0.712   0.335   0.684   0.762   0.725  -0.459   0.619   0.788   0.813   0.717   0.664 
Q24  -0.485  -0.115  -0.431  -0.552  -0.283   0.351  -0.227  -0.283  -0.191  -0.240  -0.271 
Q25   0.502   0.412   0.662   0.785   0.762  -0.579   0.607   0.664   0.683   0.602   0.588 
Q26  -0.211  -0.109  -0.224  -0.191  -0.310   0.242  -0.076  -0.161  -0.062  -0.054  -0.147 
Q27   0.428   0.250   0.506   0.553   0.455  -0.250   0.330   0.483   0.537   0.505   0.408 
Q28   0.496   0.374   0.536   0.685   0.561  -0.315   0.580   0.623   0.587   0.747   0.657 
Q29  -0.581  -0.144  -0.441  -0.464  -0.357   0.359  -0.198  -0.386  -0.302  -0.276  -0.366 
Q30   0.568   0.204   0.526   0.590   0.627  -0.344   0.478   0.665   0.682   0.669   0.588 
Q31  -0.281  -0.177  -0.214  -0.325  -0.318   0.274  -0.205  -0.352  -0.339  -0.365  -0.254 
Q32   0.459   0.372   0.539   0.674   0.584  -0.263   0.643   0.653   0.635   0.700   0.681 
Q33  -0.430  -0.203  -0.403  -0.575  -0.550   0.555  -0.412  -0.423  -0.426  -0.197  -0.407 
Q34  -0.423  -0.300  -0.507  -0.486  -0.523   0.527  -0.398  -0.485  -0.423  -0.296  -0.282 
Q35   0.501   0.235   0.397   0.339   0.349  -0.050   0.382   0.431   0.370   0.394   0.446 
Q36   0.087  -0.091   0.142   0.176   0.362  -0.156   0.249   0.318   0.307   0.279   0.352 
Q37   0.528   0.117   0.525   0.468   0.483  -0.253   0.383   0.531   0.449   0.566   0.689 
Q38   0.100  -0.057   0.176   0.247   0.436  -0.150   0.351   0.305   0.243   0.541   0.485 
Q39   0.591   0.259   0.575   0.640   0.604  -0.444   0.540   0.679   0.538   0.721   0.592 
Q40   0.344   0.030   0.311   0.435   0.448  -0.367   0.280   0.393   0.433   0.342   0.424 



between ease of error correction and clarity of information, 
system expectation, and ease of use. 

Question 4 correlated with questions ( 5, 8, 13, 15, 
18, 19, 20, 22, 23, 25), which indicates that there is a 
strong relationship between enjoyment of using the system and 
the system providing clear information, system output being 






TABLE II 
CORRELATION ANALYSIS (continued) 

        Q12     Q13     Q14     Q15     Q16     Q17     Q18     Q19     Q20     Q21     Q22 
Q13   0.732 
Q14   0.788   0.473 
Q15   0.813   0.875   0.715 
Q16   0.758   0.906   0.580   0.872 
Q17   0.206   0.173   0.173   0.139  -0.024 
Q18   0.708   0.699   0.515   0.835   0.697   0.154 
Q19   0.660   0.775   0.439   0.788   0.754   0.022   0.805 
Q20   0.670   0.798   0.466   0.807   0.748   0.024   0.744   0.823 
Q21   0.775   0.771   0.701   0.878   0.768   0.052   0.850   0.825   0.792 
Q22   0.656   0.643   0.449   0.682   0.660   0.097   0.812   0.777   0.767   0.767 
Q23   0.737   0.767   0.521   0.856   0.753   0.041   0.787   0.795   0.837   0.878   0.746 
Q24  -0.237  -0.238  -0.231  -0.393  -0.347   0.12   -0.440  -0.506  -0.539  -0.431  -0.573 
Q25   0.548   0.676   0.337   0.688   0.633     .     0.680   0.747   0.82    0.696   0.745 
Q26  -0.199  -0.079  -0.338  -0.186  -0.094     .    -0.311  -0.235     .    -0.272  -0.342 
Q27   0.542   0.498   0.504   0.595   0.497   0.142   0.658   0.360     .     0.658   0.768 
Q28   0.647   0.760   0.395   0.769   0.745   0.064   0.642   0.757     .     0.699   0.762 
Q29  -0.295  -0.288  -0.253  -0.439  -0.396   0.065  -0.547  -0.429  -0.4    -0.419  -0.567 
Q30   0.682   0.654   0.490   0.794   0.664   0.133   0.723   0.659   0.6     0.760   0.660 
Q31  -0.265  -0.292  -0.252  -0.407  -0.369   0.180  -0.375  -0.416  -0.359  -0.499  -0.351 
Q32   0.626   0.734   0.452   0.746   0.755   0.185   0.561   0.655   0.689   0.710   0.696 
Q33  -0.163  -0.391  -0.108  -0.398  -0.370   0.045  -0.418  -0.404  -0.404  -0.318  -0.252 
Q34  -0.262  -0.423  -0.219  -0.409  -0.338  -0.066  -0.367  -0.416  -0.471  -0.421  -0.304 
Q35   0.423   0.338   0.402   0.434   0.378  -0.104   0.234   0.382   0.276   0.457   0.240 
Q36   0.355   0.211   0.514   0.440   0.367  -0.035   0.188   0.234   0.263   0.350   0.235 
Q37   0.610   0.496   0.342   0.613   0.597   0.084   0.557   0.644   0.598   0.555   0.588 
Q38   0.514   0.260   0.537   0.499   0.474  -0.300   0.376   0.364   0.351   0.464   0.516 
Q39   0.600   0.600   0.458   0.759   0.678   0.025   0.700   0.757   0.697   0.763   0.658 
Q40   0.400   0.314   0.461   0.467   0.409   0.030   0.439   0.409   0.397   0.579   0.387 

        Q23     Q24     Q25     Q26     Q27     Q28     Q29     Q30     Q31     Q32     Q33 
Q24  -0.375 
Q25   0.775  -0.406 
Q26  -0.264   0.371  -0.275 
Q27   0.617  -0.376   0.520  -0.370 
Q28   0.734  -0.410   0.715  -0.159   0.652 
Q29  -0.340   0.687  -0.310   0.461  -0.527  -0.382 
Q30   0.732  -0.416   0.640  -0.098   0.765   0.73   -0.491 
Q31  -0.354   0.273  -0.423   0.371  -0.276  -0.397   0.423  -0.391 
Q32   0.670  -0.241   0.672  -0.049   0.588   0.872  -0.255   0.678  -0.371 
Q33  -0.328   0.290  -0.495   0.064  -0.202  -0.325   0.340  -0.476   0.308  -0.419 
Q34  -0.452   0.351  -0.588   0.279  -0.160  -0.346   0.234  -0.431   0.273  -0.342   0.590 
Q35   0.408  -0.137   0.251  -0.066   0.324   0.475  -0.199   0.397  -0.242   0.581  -0.274 
Q36   0.208  -0.207   0.201  -0.240   0.247   0.246  -0.267   0.364  -0.310   0.389  -0.202 
Q37   0.479  -0.506   0.403  -0.117   0.387   0.619  -0.605   0.636  -0.438   0.512  -0.402 
Q38   0.219  -0.561   0.333  -0.352   0.381   0.400  -0.467   0.435  -0.406   0.344  -0.230 
Q39   0.636  -0.475   0.557  -0.238   0.457   0.730  -0.395   0.642  -0.354   0.670  -0.294 
Q40   0.357  -0.116   0.368  -0.129   0.356   0.383  -0.308   0.490  -0.349   0.467  -0.393 

        Q34     Q35     Q36     Q37     Q38     Q39 
Q35  -0.165 
Q36  -0.057   0.353 
Q37  -0.184   0.414   0.341 
Q38  -0.056   0.187   0.498   0.665 
Q39  -0.219   0.494   0.319   0.759   0.339 
Q40  -0.293     .     0.609   0.440   0.447   0.300 
















useful, system reliability, relevancy of information, system 
expectations of user is met, ease of use, efficiency and 
system convenience. The correlation coefficient also indicates 
that clear information, useful output, reliability, relevancy 
of information, ease of use, efficiency and system convenience 
are good predictors of system usage. 

Question 5 correlated with questions (8, 9, 15, 18, 
20, 23, 25), which indicate that there is a strong linear 
relationship between satisfaction with the useful format of 
the output and clear information, happiness with the output, 
output relevancy, output expectations, ease of use, 
understandable output and system convenience. The correlation 
coefficients also indicate that clear information, happiness 
with the output, output relevancy, output expectations, ease 
of use, understandable output and system convenience, are good 
predictors of how satisfied the end-users will be with the 
format of the output. 

Question 7 correlated with questions (10, 13, 15), 
which indicates that there is a strong linear relationship 
between system accuracy, reliability and relevancy of 
information and therefore system reliability and relevancy of 
information are strong predictors of system accuracy. 

Question 8 correlated with questions (9, 10, 16, 18, 
19, 20, 21, 23), which indicates that there is a strong linear 
relationship between satisfaction with clarity of information 
provided by the system and satisfaction with the layout of the 




output, system accuracy, output relevancy, output reliability, 
output expectations are met, ease of use and understandable 
information. 

Question 9 correlated with questions (10, 13, 15, 16, 
18, 20, 21, 23), which indicates that there is a strong linear 
relationship between end users satisfaction with the layout of 
the output and system accuracy, if end-users trust the 
information provided by the system, if the output is relevant, 
if the output is reliable, if the system provides the 
information the end-user needs, if the system is easy to use, 
if the reports are complete and if the outputs are easy to 
understand. 

Question 10 correlated with questions (12, 13, 15, 16, 
21, 23, 28, 32, 39), which indicates that there is a strong 
linear relationship between the accuracy of the system and the 
system providing up-to-date information, reliability of the 
system, relevancy of the output provided by the system, 
completed reports, ease of understanding the output, system 
dependability, and satisfaction with the software application. 

Question 11 correlated with questions (12, 13, 15, 16, 
19) , which indicates a strong linear relationship between the 
system providing sufficient information and the system 
providing up to date information, end-users trusting the 
information provided by the system, relevancy of the output, 
reliability of the output and end-user expectations of the 
system. 




Question 12 correlated with questions (13, 16, 18, 21, 
23), which indicates a strong linear relationship between the 
system providing up-to-date information, and trusting the 
information provided by the system, reliability of the system, 
expectation of end-users of the reports provided by the 
system, completeness of reports and ease of understanding the 
output provided by the system. 

Question 13 correlated with questions (15, 16, 19, 20, 
21, 23, 28, 32), which indicates that there is a strong linear 
relationship between the end-user trusting the information 
provided by the system and the relevancy of the output, the 
reliability of the output, the expectations of the end-user of 
the system, ease of use, completed reports, and dependability 
of the system. 

Question 14 correlated with questions (15, 21) , which 
indicates a strong linear correlation between receiving timely 
information from the Internal Revenue Service and output 
relevancy and completed reports. 

Question 15 correlated with questions (18, 19, 20, 21, 
23, 29, 30, 32, 39), which indicates that there is a strong 
linear correlation between end-users finding the output 
relevant and the reports being exactly what the user wanted, 
the system working to the end-user's expectations, system ease 
of use, system being easy to understand, system reliability, 
system dependability, satisfaction with software application. 






Question 16 correlated with questions (19, 20, 21, 23, 
28, 32), which indicates a strong linear correlation between 
output reliability and end-user expectation of system, system 
ease of use, completed reports, and system dependability. 

Question 18 correlated with questions (20, 21, 22, 23, 
30, 39) , which indicates a strong linear relationship between 
the system providing the end-user with the reports that seem 
to be just about exactly what he/she needed and the system 
ease of use, completed reports, system efficiency, output easy 
to understand, the information content meeting the user's need 
and satisfaction with the software application. 

Question 19 correlated with questions (22, 23, 25, 28, 
39) , which indicates a strong linear relationship between the 
system meeting the expectations of the end-user and system 
efficiency, the output being easy to understand, system 
convenience, system reliability and satisfaction with software 
application. 

Question 20 correlated with questions (21, 22, 25, 
28), which indicates a strong linear correlation between 
system ease of use and completed reports, system efficiency, 
system convenience and system reliability. 

Question 21 correlated with questions (22, 30, 32, 
39) , which indicates that there is a strong linear 
relationship between system ease of use and system efficiency, 
information content meeting end-user's need, system 
dependability and satisfaction with software application. 




Question 22 correlated with questions (23, 25, 27, 
28) , which indicates that there is a strong linear 
relationship between system efficiency and the output being 
easy to understand, system convenience, system providing 
comprehensible information and system dependency. 

Question 23 correlated with questions (25, 28, 30), 
which indicates that there is a strong linear relationship 
between the output being easy to understand and system 
reliability and the information contents meeting the user's 
need. 

Question 25 correlated with question 28, which 
indicates that there is a strong linear relationship between 
system convenience and system reliability. 

Question 27 correlated with question 30, which 
indicates that there is a strong linear relationship between 
the system providing comprehensible information and the 
information contents meeting the user's need. 

Question 28 correlated with questions (30, 32, 39), 
which indicates that there is a strong linear relationship 
between the system reliability and the information content 
meeting the user's need, system dependability and satisfaction 
with the software application. 

Question 37 correlated with question 39, which 
indicates a strong linear relationship between the 
satisfaction of end-users with the information/training 






provided by the IRS and how satisfied they were with the 
software application. 
3. Factor Analysis 

Using the sample of 63 responses, the data was 
examined using exploratory factor analysis to summarize or 
reduce the data set of the original 40 questions. We first 
used principal components, as shown in Table III. 

Without specifying the number of factors, seven 
factors with eigenvalues greater than one emerged. We then 
tried to delineate more clearly the clustering and grouping of 
questions than the initial principal component matrix pattern 
showed us, by the use of rotational factor analysis, which is 
shown in Table IV. 

This method arrived at the terminal factors that 
satisfied our need. The first principal component, therefore, 
may be viewed as the single best summary of linear 
relationships exhibited in the data. We have determined this 
component to be RELIABILITY OF THE ELECTRONIC FILING SYSTEM 
PROGRAM. The second component is defined as the second best 
linear combination of variables, under the condition that the 
second component is orthogonal to the first. The second 
component may be defined as the linear combination of 
variables that accounts for the most residual variance after 
the effect of the first component is removed from the data. We 
have determined this component to be USER-PERCEIVED QUALITY OF 






TABLE III
INITIAL FACTOR METHOD: PRINCIPAL COMPONENTS FACTOR PATTERN
(only loadings of absolute value .40 and above are shown, listed in
factor-column order, FACTOR1 through FACTOR7)

QUESTION   LOADINGS
Q22         0.97856
Q21         0.94426
Q15         0.92970
Q28         0.90537
Q16         0.89998
Q30         0.89947
Q8          0.89535
Q12         0.88738
Q19         0.88544
Q13         0.88456
Q20         0.88340
Q18         0.88079
Q7          0.86672
Q23         0.86273
Q27         0.85528
Q9          0.84541
Q32         0.84405
Q11         0.83279   0.41511
Q10         0.82378
Q25         0.80779  -0.50846
Q5          0.79532
Q38         0.79436
Q14         0.73555
Q4          0.71574  -0.45283
Q39         0.69113   0.48753
Q37         0.65435   0.48554
Q3          0.50037   0.43292  -0.47408
Q2          0.44808  -0.43388
Q33        -0.51742   0.48487
Q29        -0.53698   0.50406
Q24        -0.63846   0.63780
Q36         0.70902  -0.42263
Q34        -0.50747   0.59219
Q6          0.54069   0.58198
Q35         0.55431
Q40         0.43710   0.52877  -0.46819
Q26        -0.45608   0.76080
Q17         0.60851   0.65952
Q1          0.46709  -0.66046   0.43660
Q31        -0.43766   0.71425



Subsequent components were defined similarly until all of the
variance in the data was exhausted. We have determined these
subsequent components to be AMOUNT OF TIME PLANNED TO USE EFS,
FLEXIBILITY OF THIRD PARTY SOFTWARE APPLICATION, PERCEIVED QUALITY
OF INFORMATION PROVIDED BY THE IRS, EASE OF USE AND EFFICIENCY OF
SOFTWARE, and CORRECTION MECHANISM.






TABLE IV
ROTATION METHOD: VARIMAX ROTATED FACTOR PATTERN
(only loadings of absolute value .40 and above are shown, listed in
factor-column order, FACTOR1 through FACTOR7)

QUESTION   LOADINGS
Q32         0.91682
Q8          0.88589
Q27         0.87245
Q13         0.86362
Q9          0.85791
Q16         0.81000   0.50287
Q28         0.78777
Q22         0.77675   0.48627
Q15         0.77256   0.48298
Q21         0.76129   0.44985
Q14         0.75250
Q30         0.69381
Q12         0.67748   0.58423
Q23         0.65488   0.41490  -0.44276
Q7          0.65347   0.55389
Q20         0.65181   0.46834  -0.47713
Q18         0.61940   0.58485
Q39         0.88111
Q37         0.87870
Q10         0.45684   0.78003
Q38         0.76533
Q11         0.54293   0.66867   0.40481
Q5          0.59420   0.53699
Q19         0.46071   0.58813   0.45389
Q29        -0.57087   0.55289
Q24        -0.60358   0.44484
Q2          0.81644
Q4          0.80922
Q25         0.53875   0.65955  -0.40875
Q33        -0.60940   0.49373
Q34        -0.73812
Q26         0.77261
Q17        -0.67839   0.70629
Q1          0.46311  -0.81114
Q40         0.85571
Q36         0.79302
Q6          0.42463   0.74495
Q35         0.71561
Q3          0.45970   0.56457
Q31         0.86373



In an attempt to achieve more precise and interpretable
factors, the analysis was also conducted specifying two, three,
four, five, and six factors. The seven-factor solution remained
the most interpretable. The variance explained by each factor is
shown in Table V.


TABLE V
VARIANCE EXPLAINED BY EACH FACTOR

FACTOR1    FACTOR2   FACTOR3   FACTOR4   FACTOR5   FACTOR6   FACTOR7
12.564297  7.649598  5.625159  3.668108  2.638229  2.260931  1.836028



FACTOR 1: Questions (32, 8, 27, 13, 9, 16, 28, 22, 15,
21, 14, 30, 12, 23, 7, 20) are the variables that account
for Factor 1. All of the retained questions taken
together seem to suggest that RELIABILITY OF THE ELECTRONIC
FILING SYSTEM PROGRAM is the most representative issue
identified by Factor 1; it accounts for 12.56 percent of the
variance. The questions for Factor 1 were:

32 - Do you find the EFS system dependable?

8 - Is the information clear?

27 - Does the system provide comprehensive information?

13 - Do you trust the information provided by the system?

9 - Are you happy with the layout of the output?

16 - Do you feel the output is reliable?

28 - Do you think the system is reliable?

22 - Is the system efficient?

15 - Do you find the output relevant?

21 - Are the reports complete?

14 - Do you get the information you need in time from the IRS?

30 - Does the information content meet your needs?

12 - Does the system provide up-to-date information?

23 - Is the output easy to understand?

7 - Are you satisfied with the accuracy of the system?

20 - Is the system easy to use?

FACTOR 2: Questions (39, 37, 10, 38, 11, 5, 19, 24)
are the variables that account for Factor 2. All
of the retained questions taken together seem to suggest that
USER-PERCEIVED QUALITY OF SOFTWARE APPLICATION is the most
representative issue identified by Factor 2; it accounts for
7.6 percent of the variance. The questions for Factor 2 were:

39 - How satisfied are you with the application?

37 - How satisfied are you with the information/training provided by the software vendor?

10 - Is the system accurate?

38 - How satisfied are you with the information/training provided by the tax consultant?

11 - Does the system provide sufficient information?

5 - Do you think the output is presented in a useful format?

19 - Does the system work to your expectations?

24 - Is the system troublesome?

FACTOR 3: Questions (2, 4, 25, 33, 34) are the
variables that account for Factor 3. All of the retained
questions taken together seem to suggest that AMOUNT OF TIME
PLANNED TO USE EFS is the most representative issue identified
by Factor 3; it accounts for 5.6 percent of the variance. The
questions for Factor 3 were:

2 - How much time do you plan to use EFS?

4 - Do you enjoy using the system?

25 - Is the system convenient?

33 - Would you like the EFS system to be modified or redesigned?

34 - Would you like the format modified?

FACTOR 4: Questions (26, 17, 1) are the variables
that account for Factor 4. All of the retained
questions taken together seem to suggest that FLEXIBILITY OF
THIRD PARTY SOFTWARE APPLICATION is the most representative
issue identified by Factor 4; it accounts for 3.6 percent of
the variance. The questions for Factor 4 were:

26 - Is the system difficult to interact with?

17 - Does the system provide too much information?

1 - Is the system flexible?






FACTOR 5: Questions (40, 36) are the variables
that account for Factor 5. The retained
questions taken together seem to suggest that PERCEIVED
QUALITY OF INFORMATION PROVIDED BY THE IRS is the most
representative issue identified by Factor 5; it accounts for
2.6 percent of the variance. The questions for Factor 5 were:

40 - How satisfied are you with the tax literature regarding EFS?

36 - How satisfied are you with the information/training provided by the IRS?

FACTOR 6: Questions (6, 35, 3) are the variables
that account for Factor 6. All of the retained
questions taken together seem to suggest that EASE OF USE AND
EFFICIENCY OF SOFTWARE is the most representative issue
identified by Factor 6; it accounts for 2.2 percent of the
variance. The questions for Factor 6 were:

6 - Is the system difficult to operate?

35 - Do you get feedback fast enough?

3 - Is it easy to correct errors?

FACTOR 7: Question (31) is the variable that
accounts for Factor 7. The retained question seems to
suggest that CORRECTION MECHANISM is the most representative
issue identified by Factor 7; it accounts for 1.8 percent of
the variance. The question for Factor 7 was:

31 - Does the information you receive require correction?




4. Linear Regression 

An attempt was made to relate the seven factors discussed in
the previous section to the number of returns filed
electronically. We assumed that the number of tax returns filed
is a function of the seven factors. Two conclusions can be drawn
from this analysis: (1) the low usage of EFS could be explained
by the limited amount of time tax preparers devote to EFS, and
(2) tax preparers perceive that the IRS EFS feedback is not fast
enough. Table VI is an R-squared analysis that includes tax
preparers who did not file tax returns electronically and is
representative of the first finding. The second finding,
represented in Table VII, is an R-squared analysis of the tax
preparers who filed electronically. For those tax preparers who
actually used EFS, EASE OF USE, RELIABILITY, and PERCEIVED
QUALITY OF INFORMATION are the most critical issues.

Stepwise regression was then used in an attempt to
explain other factors; the results were inconclusive. The
seven interpretable factors that satisfied our model are
represented in Figure 2.
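
The R-squared comparisons summarized in Tables VI and VII can be
sketched as an all-possible-subsets regression. The fragment below is
illustrative only: the file name regression_data.csv, the
dependent-variable column "returns", and the use of the composite
predictor labels Q640 and Q1726 exactly as they appear in the tables
are assumptions made for the example, not a description of the
thesis's actual data set.

    # Illustrative sketch: R-squared for every subset of the factor-representative
    # predictors, in the spirit of Tables VI and VII. File and column names are
    # hypothetical.
    from itertools import combinations

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    data = pd.read_csv("regression_data.csv")
    predictors = ["Q2", "Q31", "Q32", "Q35", "Q39", "Q640", "Q1726"]
    y = data["returns"]                      # number of returns filed electronically

    results = []
    for k in range(1, len(predictors) + 1):
        for subset in combinations(predictors, k):
            X = data[list(subset)]
            r2 = LinearRegression().fit(X, y).score(X, y)   # in-sample R-squared
            results.append((k, r2, " ".join(subset)))

    for k, r2, names in sorted(results):
        print(f"{k}  {r2:.4f}  {names}")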






TABLE VI
REGRESSION MODELS FOR DEPENDENT VARIABLE:
MODEL: MODEL 1
(single-variable models and the full seven-variable model)

NUMBER IN MODEL   R-SQUARE     VARIABLES IN MODEL
1                 0.00562228   Q1726
1                 0.00831622   Q31
1                 0.04101621   Q39
1                 0.04520710   Q640
1                 0.07468453   Q32
1                 0.09423416   Q35
1                 0.17047368   Q2
7                 0.29176667   Q2 Q35 Q640 Q31 Q32 Q1726 Q39






TABLE VII
REGRESSION MODELS FOR DEPENDENT VARIABLE: RETURNS
MODEL: MODEL 1
(single-variable models and the full seven-variable model)

NUMBER IN MODEL   R-SQUARE     VARIABLES IN MODEL
1                 0.00026853   Q2
1                 0.06049816   Q31
1                 0.06761047   Q640
1                 0.06898072   Q39
1                 0.10173493   Q35
1                 0.15208803   Q32
1                 0.29913436   Q1726
7                 0.59753465   Q1726 Q32 Q640 Q35 Q2 Q31 Q39






Figure 2. Summary of the Model. EFS USE is modeled as a function of
seven factors: F.1 Reliability of the EFS Program, F.2 User-Perceived
Quality of Software, F.3 Time Planned to Use EFS, F.4 Flexibility of
Third Party Software, F.5 Perceived Quality of Information from the
IRS, F.6 Ease of Use of Software, and F.7 Correction Mechanism.



RETAINED QUESTIONS RELATED TO THE MODEL:

RELIABILITY OF THE ELECTRONIC FILING SYSTEM PROGRAM

F.1 - Do you find the EFS system dependable?

USER-PERCEIVED QUALITY OF SOFTWARE

F.2 - How satisfied are you with the application?

AMOUNT OF TIME PLANNED TO USE EFS

F.3 - How much time do you plan to use EFS?

FLEXIBILITY OF THIRD PARTY SOFTWARE APPLICATION

F.4 - Is the system difficult to interact with?

F.4 - Does the system provide too much information?

PERCEIVED QUALITY OF INFORMATION PROVIDED BY THE IRS

F.5 - How satisfied are you with the tax literature regarding EFS?

F.5 - How satisfied are you with the information/training provided by the IRS?

EASE OF USE AND EFFICIENCY OF SOFTWARE

F.6 - Is the system difficult to operate?

F.6 - Do you get feedback fast enough?

CORRECTION MECHANISM

F.7 - Does the information you receive require correction?






VI. SUMMARY: 

A. SUMMARY OF RESULTS 

This study presents significant findings on the factors that
end-users perceive as important in the measurement of end-user
computing satisfaction. Our statistical analysis confirms
some of the verbal recommendations made by the tax preparers
surveyed. Training appears to be one of the most important
factors in convincing tax preparers of the reliability of
EFS. The scores obtained for the questions representing Factor
1 (RELIABILITY OF THE EFS PROGRAM) were approximately 3.7; we
suspect that tax preparers should be encouraged to devote
more time to getting acquainted with the documentation
provided by the IRS, which was generally perceived as
satisfactory. Overall, all of the questions representing the
seven factors received high scores from the survey. This
suggests that we should find other factors that could be used
to provide more incentive for EFS usage. The survey indicates
that cost is a significant factor. This is particularly true
for small firms, which perceive the initial investment required
for EFS use as too costly.
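
As a minimal sketch of how such an average item score could be computed,
again assuming the hypothetical responses.csv layout used in the earlier
examples:

    # Illustrative sketch: mean response for the Factor 1 (reliability) items.
    # Assumes the hypothetical "responses.csv" layout used earlier; 0 means N/A.
    import pandas as pd

    factor1_items = ["Q32", "Q8", "Q27", "Q13", "Q9", "Q16", "Q28", "Q22",
                     "Q15", "Q21", "Q14", "Q30", "Q12", "Q23", "Q7", "Q20"]

    responses = pd.read_csv("responses.csv").replace(0, float("nan"))
    factor1_mean = responses[factor1_items].stack().mean()   # pooled mean over items
    print(f"Mean Factor 1 score: {factor1_mean:.2f}")        # thesis reports roughly 3.7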

B. SUGGESTIONS FOR FUTURE RESEARCH 

Our empirical research confirmed the redundancy of some of
the original 40 questions identified by Doll and Torkzadeh.
It was expected that some of the questions would be grouped
together; however, the factor analysis spread them over the
seven factors. This scattering seems to imply that either
these questions were not properly devised or the respondents
were somewhat confused by the wording of some of the questions.






VII. CONCLUSION 

As the third party software for EFS increases in quality
and reliability and as the EFS program starts its fifth year,
we contend that technology will no longer be an implementation
issue. Therefore, successful promotion of EFS will depend on
the ability of the IRS to motivate the public at large, and
consequently the tax preparers, to recognize the real benefits
of EFS.






APPENDIX 



END-USER SATISFACTION SURVEY 

OF THE 

IRS ELECTRONIC FILING SYSTEM PROGRAM (EFS) 



Naval Postgraduate School 
Monterey, CA 93943-5000 



To: 



We are trying to determine the factors to be considered to ensure a 
successful implementation of the EFS program. Enclosed is a questionnaire 
adopted from the MIS literature to measure end-user computing satisfaction. 
You may find some of the questions redundant or closely related to one another 
for statistical purposes. 

Please take a few minutes to complete this survey and return it to us in 

the stamped, self-addressed envelope by . We guarantee that 

all the information you provided will be held confidential. If you would like a 
copy of the completed study please indicate on the last page of the survey. 

Thank you for your time. 



Margaret Y. Hall 

Lt USN, Graduate Student 



Professor Tung Bui, PhD 
Thesis Advisor 



Encl. (4)






APPENDIX 



Section I 



Name of interviewee: 
Address: 



Name of Company: 

Your Function at Company: 



Years of experience with the Electronic Filing System: 

Number of tax preparers in your company: 

Number of tax preparers dealing with EFS: 



Number of tax preparers planning to use EFS in the near 
future: 

Hardware Environment: 

Type of computer : 



Type of modem used (baud): 9600  4800  1200  300



Number of modems used: 



Type of Operating System: /Version: 

Section II 

MEASURES OF END-USER COMPUTING SATISFACTION 

Please enter the name of software application used for the Electronic 

Filing System. (If more than one 

software application is used please 

specify: ) 



For each of the following questions please mark an X in the box which best
describes your satisfaction with this application, using a 0-5 scale where: 0 = Not
Applicable (N/A); 1 = almost never; 2 = some of the time; 3 = about half of the
time; 4 = most of the time; and 5 = almost always.






APPENDIX 



0 = Not Applicable

1 = Almost Never 

2 = Some of the time 

3 = About half of the time 

4 = Most of the time 

5 = Almost always 









1 


2 


3 


4 


5 


1. Is the system flexible? 














2. How much time do you plan to use EFS? 














3. Is it easy to correct the errors? 














4. Do you enjoy using the system? 














5. Do you think the output is presented in a useful format? 














6. Is the system difficult to operate? 














7. Are you satisfied with the accuracy of the system? 














8. Is the information clear? 














9. Are you happy with the layout of the output? 














10. Is the system accurate? 














11. Does the system provide sufficient information? (e.g. 
documentation, on line help) 














12. Does the system provide up-to-date information? (e.g. software 
upgrade, information regarding new IRS protocols, etc.) 














13. Do you trust the information provided by the system? 














14. Do you get the information you need in time from the IRS? 














15. Do you find the output relevant? 














16. Do you feel the output is reliable? 














17. Does the system provide too much information regarding usage 
of the software? 














18. Does the system provide reports that seem to be just about 
exactly what you need? 














19. Does the system work to your expectations? 














20. Is the system easy to use? 














21. Are the reports complete? (e.g. carbon copy, statistical report on 
system usage, etc.) 














22. Is the system efficient? 


















APPENDIX 



0 = Not Applicable

1 = Almost Never 

2 = Some of the time 

3 = About half of the time 

4 = Most of the time 

5 = Almost always 









1 


2 


3 


4 


5 


23. Is the output easy to understand?














24. Is the system troublesome?














25. Is the system convenient?














26. Is the system difficult to interact with?














27. Does the system provide comprehensive information?














28. Do you think the system is reliable?














29. Would you like more concise output?














30. Does the information content meet your needs?














31. Does the information you receive require correction?














32. Do you find the EFS system dependable?














33. Would you like the EFS system to be modified or redesigned?














34. Would you like the format modified?














35. Do you get feedback fast enough? (e.g. acknowledged receipt of
electronic filing). 














36. How satisfied are you with the information/training provided
by the IRS? 














37. How satisfied are you with the information/training provided
by the software vendor? 














38. How satisfied are you with the information/training provided
by the tax consultants? 














39. How satisfied are you with the application?














40. How satisfied are you with the tax literature regarding EFS?


















APPENDIX 

Section III 

OPEN ENDED QUESTIONS 

41. What aspects of the application, if any, are you most satisfied with? 



42. What aspects of the application, if any, are you most dissatisfied with? 



43. In your opinion what are the most important factors in promoting a successful 
use of EFS in your company? 



44. Other comments: 



45. Would you like to receive an executive report of this survey? 

YES NO 

Thank you for your cooperation, 

Lt. Margaret Y. Hall 

Professor Tung Bui, PhD 






REFERENCES 



1. Doll, W.J. and Torkzadeh, G., "The Measurement of End-User
Computing Satisfaction," MIS Quarterly, 12(2), (1988), 259-274.

2. Singleton, J.P., McLean, E.R. and Altman, E.N., "Measuring
Information Systems Performance: Experience with the
Management Results System at Security Pacific Bank," MIS
Quarterly, 12(2), (1988), 325-337.

3. Ein-Dor, P. and Segev, E., "Information Resources Management
for End-User Computing: An Exploratory Study," Information
Resources Management Journal, 1(1), (1988), 39-46.

4. Igbaria, M. and Nachman, S., "Correlates of User
Satisfaction with End-User Computing: An Exploratory
Study," Information & Management, (in press).

5. Rivard, S. and Huff, S.L., "Factors of Success for End-User
Computing," Communications of the ACM, 31(5), (1988), 552-561.

6. DeLone, W.H., "Determinants of Success for Computer Usage
in Small Business," MIS Quarterly, 12(1), (1988), 51-61.

7. Gupta, Y.P., Guimaraes, T. and Raghunathan, T.S., "Personal
Computing Problems and Some Organizational Factors: A
Multivariate Analysis," Computers & Operations Research,
16(5), (1989), 419-430.

8. Igbaria, M., Parasuraman, S. and Pavri, F.N., "A Path
Analytic Study of the Determinants of Microcomputer
Usage," Journal of Management Systems, (in press).

9. Igbaria, M., Pavri, F.N. and Huff, S.L., "Microcomputer
Applications: An Empirical Look at Usage," Information &
Management, 16(4), (1988), 187-196.

10. Lucas, H.C. Jr., "Empirical Evidence for a Descriptive
Model of Implementation," MIS Quarterly, 2(2), (1978), 27-41.

11. Srinivasan, A., "Alternative Measures of System
Effectiveness: Associations and Implications," MIS
Quarterly, 9(3), (1985), 243-253.

12. Commissioner's Briefing on Electronic Filing Systems,
March 28, 1990, Revision 1.






BIBLIOGRAPHY 

Commissioner's Briefing on Electronic Filing Systems, March
28, 1990, Revision 1.

DeLone, W.H., "Determinants of Success for Computer Usage in
Small Business," MIS Quarterly, 12(1), (1988), 51-61.

Doll, W.J. and Torkzadeh, G., "The Measurement of End-User
Computing Satisfaction," MIS Quarterly, 12(2), (1988), 259-274.

Ein-Dor, P. and Segev, E., "Organizational Context and the
Success of Management Information Systems," Management
Science, 24(10), (1978), 1064-1077.

Ein-Dor, P. and Segev, E., "Information Resources Management for
End-User Computing: An Exploratory Study," Information
Resources Management Journal, 1(1), (1988), 39-46.

Ghani, J.A. and Al-Meer, A.R., "Effect of End-User Computing on
Job Satisfaction: An Exploratory Study," Information &
Management, (in press).

Gupta, Y.P., Guimaraes, T. and Raghunathan, T.S., "Personal
Computing Problems and Some Organizational Factors: A
Multivariate Analysis," Computers & Operations Research,
16(5), (1989), 419-430.

Igbaria, M. and Nachman, S., "Correlates of User Satisfaction
with End-User Computing: An Exploratory Study," Information &
Management, (in press).

Igbaria, M., Parasuraman, S. and Pavri, F.N., "A Path Analytic
Study of the Determinants of Microcomputer Usage," Journal of
Management Systems, (in press).

Igbaria, M., Pavri, F.N. and Huff, S.L., "Microcomputer
Applications: An Empirical Look at Usage," Information &
Management, 16(4), (1988), 187-196.

Lucas, H.C. Jr., "Empirical Evidence for a Descriptive Model of
Implementation," MIS Quarterly, 2(2), (1978), 27-41.

Rivard, S. and Huff, S.L., "Factors of Success for End-User
Computing," Communications of the ACM, 31(5), (1988), 552-561.

Singleton, J.P., McLean, E.R. and Altman, E.N., "Measuring
Information Systems Performance: Experience with the
Management Results System at Security Pacific Bank," MIS
Quarterly, 12(2), (1988), 325-337.

Srinivasan, A., "Alternative Measures of System Effectiveness:
Associations and Implications," MIS Quarterly, 9(3), (1985),
243-253.






INITIAL DISTRIBUTION LIST 



1. Defense Technical Information Center 
Cameron Station 

Alexandria, Virginia 22304-6145 

2. Library, Code 52 

Naval Postgraduate School 
Monterey, California 93943-5000 

3. Professor Tung X. Bui 
Naval Postgraduate School 
Department of Administrative Sciences
Code AS/Bd 

Monterey, California 93943-5000 

4. Internal Revenue Service
San Jose District 

55 S. Market St. 

San Jose, California 95113 

5. Naval Military Personnel Command 
Washington, D.C. 20370 

Attn: Margaret Yvonne Hall 
NMPC 1631E 





